Gradient Convergence in Gradient Methods
Authors
Abstract
For the classical gradient method x_{t+1} = x_t − γ_t ∇f(x_t) and several deterministic and stochastic variants, we discuss the issue of convergence of the gradient sequence ∇f(x_t) and the attendant issue of stationarity of limit points of x_t. We assume that ∇f is Lipschitz continuous, and that the stepsize γ_t diminishes to 0 and satisfies the standard stochastic approximation conditions Σ_t γ_t = ∞ and Σ_t γ_t² < ∞. We show that either f(x_t) → −∞, or else f(x_t) converges to a finite value and ∇f(x_t) → 0 (with probability 1 in the stochastic case). Existing results assume various boundedness conditions, such as boundedness from below of f, boundedness of ∇f(x_t), or boundedness of x_t.

¹ Research supported by NSF under Grant DMI-9625489.
² Dept. of Electrical Engineering and Computer Science, M.I.T., Cambridge, Mass., 02139.
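As a concrete illustration of the setting described in the abstract, the sketch below runs the gradient method with the diminishing stepsize γ_t = 1/(t+1), which satisfies the standard conditions Σ γ_t = ∞ and Σ γ_t² < ∞. The test objective `f`, its gradient, and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def f(x):
    # Smooth test objective with a Lipschitz-continuous gradient
    # (an illustrative choice; the paper treats general f with Lipschitz grad f).
    return 0.5 * (x @ x) + np.sin(x[0])

def grad_f(x):
    # Gradient of the test objective: x + cos(x[0]) * e_0.
    g = x.copy()
    g[0] += np.cos(x[0])
    return g

def gradient_method(x0, num_iters=5000):
    """Classical gradient method x_{t+1} = x_t - gamma_t * grad f(x_t),
    with gamma_t = 1/(t+1) so that sum gamma_t = inf and sum gamma_t^2 < inf."""
    x = np.asarray(x0, dtype=float)
    for t in range(num_iters):
        gamma = 1.0 / (t + 1)
        x = x - gamma * grad_f(x)
    return x

x_final = gradient_method([2.0, -3.0])
# Consistent with the paper's conclusion, f(x_t) converges and the
# gradient norm becomes small under the diminishing stepsize.
print(np.linalg.norm(grad_f(x_final)))
```

Because f here is bounded below, the dichotomy in the abstract reduces to its second branch: f(x_t) converges to a finite value and ∇f(x_t) → 0.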
Similar Resources
A conjugate gradient based method for Decision Neural Network training
Decision Neural Network is a new approach for solving multi-objective decision-making problems based on artificial neural networks. By using imprecise evaluation data, network training is improved and the number of required training data sets is reduced. The available training method is based on the gradient descent method (BP). One of its limitations is its convergence speed. Therefore,...
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of the line search method, based on eigenvalue analysis. The globa...
Application of frames in Chebyshev and conjugate gradient methods
Given a frame of a separable Hilbert space $H$, we present some iterative methods for solving an operator equation $Lu=f$, where $L$ is a bounded, invertible and symmetric operator on $H$. We present some algorithms based on the knowledge of frame bounds, the Chebyshev method, and the conjugate gradient method, in order to give approximate solutions to the problem. Then we i...
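The operator-equation setting above can be illustrated in finite dimensions. The sketch below is the standard conjugate gradient iteration for $Lu = f$, additionally assuming $L$ is positive definite (plain CG requires this, which is stronger than the symmetric-and-invertible setting of the abstract); the function and variable names are illustrative.

```python
import numpy as np

def conjugate_gradient(L, f, tol=1e-10, max_iter=1000):
    """Solve L u = f for a symmetric positive definite matrix L.
    (Positive definiteness is assumed here for illustration.)"""
    u = np.zeros_like(f, dtype=float)
    r = f - L @ u          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Lp = L @ p
        alpha = rs / (p @ Lp)   # exact line search along p
        u += alpha * p
        r -= alpha * Lp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # L-conjugate direction update
        rs = rs_new
    return u
```

In exact arithmetic this iteration terminates in at most n steps for an n-by-n matrix, which is why CG is a natural inner solver in the frame-based schemes the abstract describes.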
A Class of Nested Iteration Schemes for Generalized Coupled Sylvester Matrix Equation
Global Krylov subspace methods are among the most efficient and robust methods for solving the generalized coupled Sylvester matrix equation. In this paper, we propose the nested splitting conjugate gradient process for solving this equation. This method has inner and outer iterations, and employs the generalized conjugate gradient method as an inner iteration to approximate each outer iterate, while each...
An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems
In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
Comparison of the Sensitivity Analysis and Conjugate Gradient Algorithms for Optimization of Opening and Closing Angles of Valves to Reduce Fuel Consumption in the XU7/L3 Engine
This study compares the results and convergence rates of the sensitivity analysis and conjugate gradient algorithms for reducing fuel consumption and increasing engine performance by optimizing the opening and closing timing of valves in the XU7/L3 engine. Considering the strength and accuracy of the GT-POWER simulation software in research on internal combustion engine...
Journal:
Volume, Issue:
Pages:
Publication date: 1997